Deep Learning Part-I with Mohammad Wasiq¶

Uses of Deep Learning

  1. Healthcare
  2. Market Price
  3. Facial Recognition
  4. Speech Recognition
  5. Image Recognition
  6. Online Advertisement
  7. Photo Tagging
  8. Autonomous Driving

$$Deep\,\, Learning$$¶

$$\Downarrow$$¶

$$Neural\,\, Network$$¶

$$\text{Functional unit of Deep Learning}$$
  • Algorithm: Automated Instructions
  • Artificial Intelligence: Programs with the ability to mimic human behavior.
  • Machine Learning: Algorithm with the ability to learn without being explicitly programmed.
  • Deep Learning: Subset of machine learning in which artificial neural networks adapt and learn from vast amounts of data.

$$Data+Rule \Rightarrow Programming \Rightarrow Output$$¶

$$Data+Output \Rightarrow Machine\,\, Learning \Rightarrow Rule$$¶

In [ ]:
import numpy as np
import pandas as pd
import tensorflow as tf
from tensorflow import keras
In [ ]:
print(tf.__version__)
2.9.2
In [ ]:
# Simplest sequential neural network: a single Dense layer with one neuron
model = tf.keras.Sequential([keras.layers.Dense(units=1, input_shape=[1])])
In [ ]:
model.compile(optimizer='sgd', loss='mean_squared_error')
In [ ]:
# NumPy arrays for the data (ys follows y = 2x - 1)
xs = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0], dtype=float)
ys = np.array([-3.0, -1.0, 1.0, 3.0, 5.0, 7.0], dtype=float)
In [ ]:
# Training of neural network
model.fit(xs, ys, epochs=500)
Epoch 1/500
1/1 [==============================] - 1s 548ms/step - loss: 52.0427
Epoch 2/500
1/1 [==============================] - 0s 8ms/step - loss: 41.3465
Epoch 3/500
1/1 [==============================] - 0s 12ms/step - loss: 32.9230
...
Epoch 498/500
1/1 [==============================] - 0s 14ms/step - loss: 6.9081e-05
Epoch 499/500
1/1 [==============================] - 0s 8ms/step - loss: 6.7662e-05
Epoch 500/500
1/1 [==============================] - 0s 7ms/step - loss: 6.6273e-05
Out[ ]:
<keras.callbacks.History at 0x7fc0a37ba940>

$$Y= 2X-1$$¶

The network has approximately learned this relationship, so for $X=10$ we expect a prediction close to $19$.

In [ ]:
print(model.predict([10]))
1/1 [==============================] - 0s 92ms/step
[[18.976248]]
In [ ]:
print(model.predict([10, 20, 30, 44, 45.5]))
1/1 [==============================] - 0s 78ms/step
[[18.976248]
 [38.941826]
 [58.907402]
 [86.85921 ]
 [89.85404 ]]
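
Since the training data follows $Y= 2X-1$, we can also sanity-check what the single neuron learned. A minimal sketch (not part of the original run): the layer's weight and bias should come out close to $2$ and $-1$.

In [ ]:
# Inspect the learned weight and bias of the single Dense layer;
# after 500 epochs they should be close to 2 and -1 respectively
weight, bias = model.layers[0].get_weights()
print(f'weight: {weight}, bias: {bias}')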

Fashion MNIST Data¶

In [ ]:
fashion_mnist = tf.keras.datasets.fashion_mnist
(train_image, train_labels), (test_image, test_labels) = fashion_mnist.load_data()
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-labels-idx1-ubyte.gz
29515/29515 [==============================] - 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-images-idx3-ubyte.gz
26421880/26421880 [==============================] - 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-labels-idx1-ubyte.gz
5148/5148 [==============================] - 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-images-idx3-ubyte.gz
4422102/4422102 [==============================] - 0s 0us/step
In [ ]:
import numpy as np
import matplotlib.pyplot as plt

# We can use any index between 0 and 59999 here
index = 0

# Set number of characters per row when printing
np.set_printoptions(linewidth=320)

# Print the label and image
print(f'LABEL:{train_labels[index]}')
print(f'\nIMAGE PIXEL ARRAY:\n {train_image[index]}')

# Visualize the image
plt.imshow(train_image[index]) 
LABEL:9

IMAGE PIXEL ARRAY:
 [[  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   1   0   0  13  73   0   0   1   4   0   0   0   0   1   1   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   3   0  36 136 127  62  54   0   0   0   1   3   4   0   0   3]
 [  0   0   0   0   0   0   0   0   0   0   0   0   6   0 102 204 176 134 144 123  23   0   0   0   0  12  10   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0 155 236 207 178 107 156 161 109  64  23  77 130  72  15]
 [  0   0   0   0   0   0   0   0   0   0   0   1   0  69 207 223 218 216 216 163 127 121 122 146 141  88 172  66]
 [  0   0   0   0   0   0   0   0   0   1   1   1   0 200 232 232 233 229 223 223 215 213 164 127 123 196 229   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0 183 225 216 223 228 235 227 224 222 224 221 223 245 173   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0 193 228 218 213 198 180 212 210 211 213 223 220 243 202   0]
 [  0   0   0   0   0   0   0   0   0   1   3   0  12 219 220 212 218 192 169 227 208 218 224 212 226 197 209  52]
 [  0   0   0   0   0   0   0   0   0   0   6   0  99 244 222 220 218 203 198 221 215 213 222 220 245 119 167  56]
 [  0   0   0   0   0   0   0   0   0   4   0   0  55 236 228 230 228 240 232 213 218 223 234 217 217 209  92   0]
 [  0   0   1   4   6   7   2   0   0   0   0   0 237 226 217 223 222 219 222 221 216 223 229 215 218 255  77   0]
 [  0   3   0   0   0   0   0   0   0  62 145 204 228 207 213 221 218 208 211 218 224 223 219 215 224 244 159   0]
 [  0   0   0   0  18  44  82 107 189 228 220 222 217 226 200 205 211 230 224 234 176 188 250 248 233 238 215   0]
 [  0  57 187 208 224 221 224 208 204 214 208 209 200 159 245 193 206 223 255 255 221 234 221 211 220 232 246   0]
 [  3 202 228 224 221 211 211 214 205 205 205 220 240  80 150 255 229 221 188 154 191 210 204 209 222 228 225   0]
 [ 98 233 198 210 222 229 229 234 249 220 194 215 217 241  65  73 106 117 168 219 221 215 217 223 223 224 229  29]
 [ 75 204 212 204 193 205 211 225 216 185 197 206 198 213 240 195 227 245 239 223 218 212 209 222 220 221 230  67]
 [ 48 203 183 194 213 197 185 190 194 192 202 214 219 221 220 236 225 216 199 206 186 181 177 172 181 205 206 115]
 [  0 122 219 193 179 171 183 196 204 210 213 207 211 210 200 196 194 191 195 191 198 192 176 156 167 177 210  92]
 [  0   0  74 189 212 191 175 172 175 181 185 188 189 188 193 198 204 209 210 210 211 188 188 194 192 216 170   0]
 [  2   0   0   0  66 200 222 237 239 242 246 243 244 221 220 193 191 179 182 182 181 176 166 168  99  58   0   0]
 [  0   0   0   0   0   0   0  40  61  44  72  41  35   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]]
Out[ ]:
<matplotlib.image.AxesImage at 0x7fc09d01b610>
In [ ]:
import numpy as np
import matplotlib.pyplot as plt

# We can use any index between 0 and 59999 here
index = 1

# Set number of characters per row when printing
np.set_printoptions(linewidth=320)

# Print the label and image
print(f'LABEL:{train_labels[index]}')
print(f'\nIMAGE PIXEL ARRAY:\n {train_image[index]}')

# Visualize the image
plt.imshow(train_image[index]) 
LABEL:0

IMAGE PIXEL ARRAY:
 [[  0   0   0   0   0   1   0   0   0   0  41 188 103  54  48  43  87 168 133  16   0   0   0   0   0   0   0   0]
 [  0   0   0   1   0   0   0  49 136 219 216 228 236 255 255 255 255 217 215 254 231 160  45   0   0   0   0   0]
 [  0   0   0   0   0  14 176 222 224 212 203 198 196 200 215 204 202 201 201 201 209 218 224 164   0   0   0   0]
 [  0   0   0   0   0 188 219 200 198 202 198 199 199 201 196 198 198 200 200 200 200 201 200 225  41   0   0   0]
 [  0   0   0   0  51 219 199 203 203 212 238 248 250 245 249 246 247 252 248 235 207 203 203 222 140   0   0   0]
 [  0   0   0   0 116 226 206 204 207 204 101  75  47  73  48  50  45  51  63 113 222 202 206 220 224   0   0   0]
 [  0   0   0   0 200 222 209 203 215 200   0  70  98   0 103  59  68  71  49   0 219 206 214 210 250  38   0   0]
 [  0   0   0   0 247 218 212 210 215 214   0 254 243 139 255 174 251 255 205   0 215 217 214 208 220  95   0   0]
 [  0   0   0  45 226 214 214 215 224 205   0  42  35  60  16  17  12  13  70   0 189 216 212 206 212 156   0   0]
 [  0   0   0 164 235 214 211 220 216 201  52  71  89  94  83  78  70  76  92  87 206 207 222 213 219 208   0   0]
 [  0   0   0 106 187 223 237 248 211 198 252 250 248 245 248 252 253 250 252 239 201 212 225 215 193 113   0   0]
 [  0   0   0   0   0  17  54 159 222 193 208 192 197 200 200 200 200 201 203 195 210 165   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0  47 225 192 214 203 206 204 204 205 206 204 212 197 218 107   0   0   0   0   0   0]
 [  0   0   0   0   1   6   0  46 212 195 212 202 206 205 204 205 206 204 212 200 218  91   0   3   1   0   0   0]
 [  0   0   0   0   0   1   0  11 197 199 205 202 205 206 204 205 207 204 205 205 218  77   0   5   0   0   0   0]
 [  0   0   0   0   0   3   0   2 191 198 201 205 206 205 205 206 209 206 199 209 219  74   0   5   0   0   0   0]
 [  0   0   0   0   0   2   0   0 188 197 200 207 207 204 207 207 210 208 198 207 221  72   0   4   0   0   0   0]
 [  0   0   0   0   0   2   0   0 215 198 203 206 208 205 207 207 210 208 200 202 222  75   0   4   0   0   0   0]
 [  0   0   0   0   0   1   0   0 212 198 209 206 209 206 208 207 211 206 205 198 221  80   0   3   0   0   0   0]
 [  0   0   0   0   0   1   0   0 204 201 205 208 207 205 211 205 210 210 209 195 221  96   0   3   0   0   0   0]
 [  0   0   0   0   0   1   0   0 202 201 205 209 207 205 213 206 210 209 210 194 217 105   0   2   0   0   0   0]
 [  0   0   0   0   0   1   0   0 204 204 205 208 207 205 215 207 210 208 211 193 213 115   0   2   0   0   0   0]
 [  0   0   0   0   0   0   0   0 204 207 207 208 206 206 215 210 210 207 212 195 210 118   0   2   0   0   0   0]
 [  0   0   0   0   0   1   0   0 198 208 208 208 204 207 212 212 210 207 211 196 207 121   0   1   0   0   0   0]
 [  0   0   0   0   0   1   0   0 198 210 207 208 206 209 213 212 211 207 210 197 207 124   0   1   0   0   0   0]
 [  0   0   0   0   0   0   0   0 172 210 203 201 199 204 207 205 204 201 205 197 206 127   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0 188 221 214 234 236 238 244 244 244 240 243 214 224 162   0   2   0   0   0   0]
 [  0   0   0   0   0   1   0   0 139 146 130 135 135 137 125 124 125 121 119 114 130  76   0   0   0   0   0   0]]
Out[ ]:
<matplotlib.image.AxesImage at 0x7fc09cb56d00>
In [ ]:
# Normalize the pixel values of the train and test images to the [0, 1] range
training_images = train_image / 255.0
test_image = test_image / 255.0
In [ ]:
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation=tf.nn.relu),
    keras.layers.Dense(10, activation=tf.nn.softmax)
])
  1. Flatten layer with the input shape of the data (the image size in pixels), as mentioned in the computer vision series.

$\quad\,\,\,\,$ Flatten converts the $28 \times 28$ image into a simple linear layer.

  2. Dense: $10$ neurons in the output layer because we have $10$ classes of data labels.
In [ ]:
# Declare the sample inputs and convert to a tensor
inputs = np.array([[1.0, 3.0, 4.0, 2.0]])
inputs = tf.convert_to_tensor(inputs)
print(f'input to softmax function: {inputs.numpy()}')

# Feed the inputs to a softmax activation function
outputs = tf.keras.activations.softmax(inputs)
print(f'output to softmax function: {outputs.numpy()}')

# Get the sum of all values after the softmax (should be 1.0)
total = tf.reduce_sum(outputs)
print(f'sum of outputs: {total}')

# Get the index with the highest value
prediction = np.argmax(outputs)
print(f'class with highest probability: {prediction}')
input to softmax function: [[1. 3. 4. 2.]]
output to softmax function: [[0.0320586  0.23688282 0.64391426 0.08714432]]
sum of outputs: 1.0
class with highest probability: 2
In [ ]:
model.compile(optimizer=tf.optimizers.Adam(),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

model.fit(training_images, train_labels, epochs=5)
Epoch 1/5
1875/1875 [==============================] - 6s 3ms/step - loss: 0.4961 - accuracy: 0.8264
Epoch 2/5
1875/1875 [==============================] - 6s 3ms/step - loss: 0.3776 - accuracy: 0.8652
Epoch 3/5
1875/1875 [==============================] - 6s 3ms/step - loss: 0.3375 - accuracy: 0.8767
Epoch 4/5
1875/1875 [==============================] - 6s 3ms/step - loss: 0.3115 - accuracy: 0.8863
Epoch 5/5
1875/1875 [==============================] - 6s 3ms/step - loss: 0.2940 - accuracy: 0.8921
Out[ ]:
<keras.callbacks.History at 0x7fc09cafde50>
In [ ]:
# Evaluate the model on unseen data
model.evaluate(test_image, test_labels)
313/313 [==============================] - 2s 4ms/step - loss: 0.3752 - accuracy: 0.8611
Out[ ]:
[0.37521636486053467, 0.8611000180244446]
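
As a follow-up sketch (using the same variables as above; `classifications` is just an illustrative name), we can run the trained classifier over the test images and compare the predicted class of the first test image with its true label:

In [ ]:
# Predict class probabilities for all test images, then compare the
# highest-probability class of the first image with its true label
classifications = model.predict(test_image)
print(f'predicted class: {np.argmax(classifications[0])}, true label: {test_labels[0]}')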

Supervised Deep Learning using Neural Networks¶

1. Standard Neural Network¶

Structured Data

2. Convolutional Neural Network (CNN)¶

Unstructured Data (Image/Color)

3. Recurrent Neural Network (RNN)¶

Sequential Data (e.g., text, time series)

Activation Functions¶

What is an activation function?¶

An Activation function is a critical part of the design of a neural network.

An activation function in a neural network defines how the weighted sum of the input is transformed into an output from a node or nodes in a layer of the network.

Sometimes the activation function is called a "transfer function".

If the output range of the activation function is limited, then it may be called a "squashing function".

Many activations are nonlinear and may be referred to as the "nonlinearity" in the layer or the network design.

[Figure: activation function inside a neuron]

Types of Activation Function¶

Screenshot (75).png

Why Activation Functions?¶

  • Tonnes of data are available

    • Data is unstructured.
    • There is no defined line between useful and non-useful data.
    • Noisy data will not give the required output.

    $$\text{Activation functions help us to deal with it.}$$ Activation functions are mathematical equations that determine the output of a neural network model. They help the network to use the important information and suppress the noise.

    • In a neural network, activation functions are used to bring non-linearities into the decision boundary.
    • The goal of introducing non-linearities in data is to simulate a real-world situation.
$$\text{Almost all of the data we deal with in real life is nonlinear. This is what makes neural networks so effective.}$$

Where exactly Activation functions are used in neural network ?¶

A simple model of a neuron is commonly known as the perceptron. It can be divided into three parts:

[Figure: the perceptron: input layer, neuron, and output layer]

i. Input layer
ii. Neuron
iii. Output Layer

We first calculate the weighted sum of the inputs, i.e. $$\sum_{i=1}^n w_i x_i$$

Then the activation function is applied: $$\phi\Bigg(\sum_{i=1}^n w_i x_i\Bigg)$$

The calculated value is then passed to the next layer, if present.

$$z= \sum_i w_ix_i+ b$$
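
As a minimal numeric sketch of this computation (the inputs, weights, and bias below are illustrative values, not from any trained model):

In [ ]:
# Weighted sum of the inputs plus bias for a single neuron: z = sum(w_i * x_i) + b
x = np.array([1.0, 2.0, 3.0])    # inputs (illustrative)
w = np.array([0.5, -0.2, 0.1])   # weights (illustrative)
b = 0.4                          # bias (illustrative)
z = np.dot(w, x) + b
print(z)  # 0.5*1 - 0.2*2 + 0.1*3 + 0.4 = 0.8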

But:

  • Real-life data is non-linear, so we need non-linear equations.
  • Activation functions help us to build them.

Linear Activation function¶

The linear activation function, also known as "no activation" or the "identity function" (the input multiplied by $1.0$), is one where the activation is proportional to the input.

The function doesn't do anything to the weighted sum of the input; it simply spits out the value it was given. $$f(x)= x$$

A linear activation function has two major problems:

  1. It's not possible to use backpropagation, as the derivative of the function is a constant and has no relation to the input $x$.
  2. All layers of the neural network will collapse into one if a linear activation function is used. No matter the number of layers in the neural network, the last layer will still be a linear function of the first layer (see the sketch below). $$\text{So, essentially, a linear activation function turns the neural network into just one layer.}$$
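
A tiny numeric sketch of this collapse with two illustrative weight matrices: without a non-linearity between them, two stacked linear layers are exactly one linear map.

In [ ]:
# Two stacked linear layers collapse into a single linear map:
# W2 @ (W1 @ x) == (W2 @ W1) @ x for every input x
W1 = np.array([[1.0, 2.0], [3.0, 4.0]])   # first layer weights (illustrative)
W2 = np.array([[0.5, -1.0], [2.0, 0.0]])  # second layer weights (illustrative)
x = np.array([1.0, -1.0])
print(W2 @ (W1 @ x))   # output of the two stacked layers
print((W2 @ W1) @ x)   # same output from the equivalent single layer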

Non-linear Activation functions¶

A linear activation function has limited power (the model is then simply a linear regression) and does not allow the model to create complex mappings between the network's inputs and outputs.

$\text{Non-linear activation functions solve the following limitations of linear activation functions}$:¶

  • They allow backpropagation because the derivative function is now related to the input, and it's possible to go back and understand which weights in the input neurons can provide a better prediction.
  • They allow the stacking of multiple layers of neurons, as the output is now a non-linear combination of the input passed through multiple layers. Any output can be represented as a functional computation in a neural network.

Binary Step Function¶

The binary step function is a threshold-based function, which means:

  • Above a certain threshold, the neuron is activated.
  • Below that threshold, the neuron is deactivated.

In the graph below, the threshold is zero. This activation can be used in binary classification, as the name suggests; however, it cannot be used in a situation where you have multiple classes to deal with.

Screenshot (79).png
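A minimal sketch of the binary step function with the threshold fixed at zero, as in the graph (the test inputs are made up for illustration):

In [ ]:
# Binary step: 1 = activated (at/above threshold), 0 = deactivated.
import numpy as np

def binary_step(x):
    return np.where(x >= 0, 1, 0)

print(binary_step(np.array([-2.0, -0.1, 0.0, 0.5, 3.0])))  # [0 0 1 1 1]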

Sigmoid / Logistic Function¶

  • Smooth curve between $0$ and $1$.
  • This function takes any real value as input and outputs values in the range of $0$ to $1$.
  • The larger the input (more positive), the closer the output value will be to $1.0$, whereas the smaller the input (more negative), the closer the output will be to $0.0$, as shown below.
$$Sigmoid/ Logistic$$$$f(x)= \frac{1}{1+e^{-x}}$$

where $e$ is a mathematical constant, which is the base of the natural logarithm.

Screenshot (80).png
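A minimal sketch of the formula above (the sample inputs are illustrative):

In [ ]:
# Sigmoid/logistic function f(x) = 1 / (1 + e^(-x)).
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

x = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(sigmoid(x))   # outputs squashed into (0, 1); extremes saturate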

Advantages:¶

Why the sigmoid function is so widely used:

  • It is commonly used for models where we have to predict a probability as the output. Since the probability of anything exists only in the range of $0$ to $1$, sigmoid is the right choice because of its range.
  • It normalizes the output of each neuron.
  • The function is differentiable and provides a smooth gradient, i.e. it prevents jumps in output values.

This is represented by the S-shape of the sigmoid activation function.

$$\text{However, Sigmoid function makes almost no change in the prediction for very high or very low inputs which ultimately results in neural network refusing to learn further, this problem is known as the vanishing gradient.}$$

Disadvantages:¶

  • Computationally expensive
  • Outputs not zero centered
  • Vanishing gradient --- for very high or very low values of $X$, there is almost no change to the prediction, causing a vanishing gradient problem. This can result in the network refusing to learn further, or being too slow to reach an accurate prediction.

Tanh Function (Hyperbolic Tangent)¶

  • Very similar to the sigmoid/logistic activation function.
  • It even has the same S-shape.
  • But it differs in its output range of $-1$ to $1$.
  • In tanh, the larger the input (more positive), the closer the output value will be to $1.0$, whereas the smaller the input (more negative), the closer the output will be to $-1.0$.
$$Tanh= f(x)= \frac{(e^x- e^{-x})}{(e^x+ e^{-x})}$$

where $e$ is a mathematical constant, which is the base of the natural logarithm.

Screenshot (81).png
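A minimal sketch of the formula above (the sample inputs are illustrative):

In [ ]:
# np.tanh evaluates (e^x - e^-x) / (e^x + e^-x) directly.
import numpy as np

x = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(np.tanh(x))   # outputs in (-1, 1), centred on zero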

Advantages:¶

  • Zero Centered
  • The prediction is simple, i.e. based on a threshold probability value.
$$\text{However, tanh also comes with the vanishing gradient problem just like sigmoid function.}$$

Rectified Linear Unit (ReLU) Function¶

  • In this function, outputs for positive inputs can range from $0$ to $\infty$.
  • Although it gives the impression of a linear function, ReLU has a derivative function and allows for backpropagation, while simultaneously being computationally efficient.
  • The main catch here is that the ReLU function does not activate all the neurons at the same time.
  • A neuron will only be deactivated if the output of the linear transformation is less than $0$.
$$ReLU:\quad f(x)= \max(0, x)$$

Screenshot (82).png
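A minimal sketch of $f(x)= \max(0, x)$ (the test inputs are illustrative):

In [ ]:
# ReLU zeroes out negative inputs, so only some neurons are
# active at any given time.
import numpy as np

def relu(x):
    return np.maximum(0, x)

print(relu(np.array([-3.0, -0.5, 0.0, 2.0, 7.0])))  # [0. 0. 0. 2. 7.]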

Advantages:¶

  • Since only a certain number of neurons are activated, the ReLU function is far more computationally efficient when compared to the sigmoid and tanh functions.
  • ReLU accelerates the convergence of gradient descent towards the global minimum of the loss function due to its linear, non-saturating property.

Disadvantages:¶

  • When the input is zero or a negative value, the function outputs zero, and this hinders back-propagation. This problem is known as the dying ReLU problem.

Screenshot (83).png

Screenshot (84).png

Parametric ReLU (Rectified Linear Unit) Function¶

Advantages:¶

  • Parametric ReLU is another variant of ReLU that aims to solve the problem of gradients becoming zero for the left half of the axis.
  • This function takes the slope of the negative part of the function as an argument $a$. The most appropriate value of $a$ is learnt through back-propagation.
  • The parametric ReLU function is used when the Leaky ReLU function still fails at solving the problem of dead neurons, and the relevant information is not successfully passed to the next layer.

Disadvantages:¶

  • This function's limitation is that it may perform differently for different problems depending upon the value of slope parameter $a$.
$$\text{Parametric ReLU:}\quad f(x)= \max(ax, x)$$

where $a$ is the slope parameter for negative values.

$$\text{ReLU} \quad\quad\quad \rightarrow \text{Leaky ReLU} \quad\quad\quad\quad\quad \rightarrow \text{Parametric ReLU}$$$$f(x)= \max(0, x) \quad\quad \rightarrow f(x)= \max(0.1x, x) \quad\quad \rightarrow f(x)= \max(ax, x)$$
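A minimal sketch contrasting the three variants on the same inputs. In a real network the slope $a$ of Parametric ReLU is learnt by back-propagation; here it is fixed at an arbitrary illustrative value.

In [ ]:
# ReLU vs. Leaky ReLU vs. Parametric ReLU on the same inputs.
import numpy as np

x = np.array([-4.0, -1.0, 0.0, 2.0])
a = 0.25                                   # illustrative learnt slope
print(np.maximum(0, x))        # ReLU:            [ 0.    0.   0. 2.]
print(np.maximum(0.1 * x, x))  # Leaky ReLU:      [-0.4  -0.1  0. 2.]
print(np.maximum(a * x, x))    # Parametric ReLU: [-1.   -0.25 0. 2.]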

Screenshot (85).png

Softmax Function¶

The output of the sigmoid function is in the range of $0$ to $1$, which can be thought of as a probability.

But this function faces certain problems.

e.g.
Let's suppose we have five output values of $0.8, 0.9, 0.7, 0.8$ and $0.6$ respectively.

How can we move forward with this?
The answer is: we can't.
The above values don't make sense, as the sum of all the class/output probabilities should be equal to $1$.

  • The Softmax function is described as a combination of multiple sigmoids.
  • It calculates relative probabilities. Similar to the sigmoid/logistic activation function, the Softmax function returns the probabilities of each class.
  • It is most commonly used as the activation function for the last layer of the neural network in the case of multi-class classification.

Screenshot (86).png

Advantages:¶

  • It can be used for multiclass classification.
  • It normalizes the output for each class between $0$ and $1$ and divides by their sum, giving the probability of the input value being in a specific class.
  • For neural networks that need to categorize inputs into numerous categories, Softmax is often employed exclusively for the output layer.
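A minimal sketch of this normalization, treating the five values from the example above as raw scores: after softmax they are valid class probabilities that sum to exactly $1$.

In [ ]:
# Softmax: exponentiate, then divide by the sum of the exponentials.
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))   # subtract max for numerical stability
    return e / e.sum()

scores = np.array([0.8, 0.9, 0.7, 0.8, 0.6])
probs = softmax(scores)
print(probs, probs.sum())       # class probabilities, sum = 1.0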

$$\bf{\text{How to choose an Activation function}}$$¶

General Rule to use Activation functions:¶

Use the sigmoid function in the output layer and tanh in all other places.
But both of these have the vanishing gradient problem.

  • Therefore ReLU can be used in place of both of the above.
  • ReLU is the most popularly used function for hidden layers.
  • It is computationally efficient due to its simplicity.
  • However, it suffers from the dying ReLU problem (zero gradient for negative inputs).

Leaky ReLU or Parametric ReLU can then be used to address this.

Screenshot (88).png

Most Widely used Activation functions for hidden layers:

  • Rectified Linear Activation (ReLU)
  • Logistic (Sigmoid)
  • Hyperbolic Tangent (Tanh)

  • ReLU is the most common activation function used for the hidden layers.

  • It is simple to implement and effective at overcoming the limitations of Sigmoid and Tanh.
  • It is less susceptible to the vanishing gradients that prevent deep models from being trained.
    • Although it can suffer from other problems, like saturated or "dead" units.
    • So we can use Leaky ReLU or Parametric ReLU.
$$\bf{\text{How to choose the best activation function ?}}$$
  • A neural network will almost always have the same activation function in all hidden layers.
  • Traditionally, the sigmoid activation function was the default activation function in the 1990s.
  • Perhaps from the mid-to-late 1990s through the 2010s, the Tanh function was the default activation function for hidden layers.
  • Modern neural network models with common architectures, such as MLP and CNN, will make use of the ReLU activation function or extensions.
  • Recurrent networks still commonly use Tanh or Sigmoid activation functions or even both.
    • For example, the LSTM (Long Short Term Memory) commonly uses the Sigmoid activation for recurrent connections and the Tanh activation for output.

The activation function used in hidden layers is typically chosen based on the type of neural network architecture.

  • Multilayer Perceptron (MLP): ReLU activation function.
  • Convolutional Neural Network (CNN): ReLU activation function.
  • Recurrent Neural Network (RNN): Tanh/Sigmoid activation function.
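A minimal Keras sketch of these defaults (the layer sizes, the 10 input features, and the 3 output classes are made-up illustrative choices; the softmax output anticipates the multi-class case discussed below):

In [ ]:
# An MLP with ReLU in the hidden layers and softmax in the output layer.
import tensorflow as tf
from tensorflow import keras

model = tf.keras.Sequential([
    keras.layers.Dense(64, activation='relu', input_shape=[10]),  # hidden
    keras.layers.Dense(64, activation='relu'),                    # hidden
    keras.layers.Dense(3, activation='softmax')                   # output
])
model.compile(optimizer='sgd', loss='categorical_crossentropy')
model.summary()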
$$\bf{\text{Activation Function for Output Layers}}$$
  • The output layer is the layer in a neural network model that directly outputs a prediction.
  • All feed-forward neural network models have an output layer.

Most widely used activation functions for output layer:

  • Linear Activation Function
  • Logistic (Sigmoid)
  • Softmax

The linear activation function is also called "identity" (multiplied by 1.0) or "no activation".
This is because the linear activation function does not change the weighted sum of the input in any way and instead returns the value directly.

How to choose the best activation function ?

  • Based on the type of prediction problem that you are solving.
    • Specifically, the type of variable that is being predicted.
  • Divide prediction problems into two main groups:
    • Predicting a categorical variable (Classification)
    • Predicting a numerical variable (Regression)

If your problem is a regression problem, you should use a linear activation function.

  • Regression: One node, linear activation.

If your problem is a classification problem, then there are three main types of classification problems and each may use a different activation function.